Meeting

Countering Disinformation Through Media Literacy

Tuesday, November 29, 2022
Speaker

Summer Lopez
Chief Program Officer, PEN America

Presider

Carla Anne Robbins
Adjunct Senior Fellow, Council on Foreign Relations

Introductory Remarks

Irina Faskianos
Vice President for National Program and Outreach, Council on Foreign Relations

Summer Lopez, chief program officer of PEN America’s Free Expression Programs, discusses the psychology and spread of disinformation and how to avoid injecting it into public discourse. The webinar will be moderated by Carla Anne Robbins, senior fellow at CFR and former deputy editorial page editor at the New York Times.

TRANSCRIPT

FASKIANOS: Thank you. Welcome to the Council on Foreign Relations Local Journalist Webinar. I’m Irina Faskianos, vice president of the National Program and Outreach at CFR.  

CFR is an independent and nonpartisan membership organization, think tank, publisher, and educational institution, focusing on U.S. foreign policy. CFR is also the publisher of Foreign Affairs magazine. As always, CFR takes no institutional positions on matters of policy. 

This webinar is part of CFR’s Local Journalists Initiative, created to help you draw connections between the local issues you cover, and national and international dynamics. Our programming puts you in touch with CFR resources and expertise on international issues, and provides a forum for sharing best practices. The webinar is on the record. We will make the video and transcript available to all of you. It will be posted on our website at CFR.org/localjournalists.  

We are pleased to have Summer Lopez and host Carla Anne Robbins with us today to talk about “Countering Disinformation Through Media Literacy.” 

Summer Lopez serves as the chief program officer of Free Expression Programs at PEN America. Previously, she served as deputy director of the Office of Democracy, Human Rights, and Governance at the United States Agency for International Development. Ms. Lopez was also vice president of operations at the AjA Project, a nonprofit organization that provides media-based programs for refugee, displaced, and immigrant youth in the U.S. and internationally.  

Carla Anne Robbins is a senior fellow at CFR. She is a faculty director of the Master of International Affairs program and clinical professor of national security studies at Baruch College’s Marxe School of Public and International Affairs. And previously, she’s—she was deputy editorial page editor at the New York Times, and chief diplomatic correspondent at the Wall Street Journal.  

So, thank you both for being with us, and having this conversation. I am going to turn it over to Carla to get us started. 

ROBBINS: Thank you, Irina. And thank you, Summer, so much for doing this. And thank you to—for everybody for joining us. And thank you to all the journalists on this, and for the work that you do. It’s a difficult and challenging time with the 24/7 news schedule, and I’m always in awe of the work that you’re doing.  

So, Summer—first, the way we’re going to do this, Summer and I are going to chat for thirty minutes or so. If you guys have a lot of questions before that, you know, throw them in, and we’ll take them in order, and we’ll throw it open to you after that or before that. 

So, Summer, before the midterms, the OSCE, the European-based election monitoring organization, issued a pretty chilling report about the United States political system. You know, we’d all read OSCE reports about, you know, developing countries, but suddenly, they were writing about us. And they were warning about threats of violence against election officials, potential voter suppression and voter intimidation, the level of election denialism among GOP candidates, and they warned about election misinformation.  

Luckily, we didn’t see the voter suppression and violence. Many of the key election deniers lost, especially those who can affect the next round of voting on the state level. But how good was the misinformation campaign in the lead-up to the midterms? Was there the deluge that many feared? And if it did take place, what were the topics? 

LOPEZ: So, I think that there wasn’t sort of the deluge that people necessarily feared, but I’m hesitant to say it wasn’t really an issue, you know? And I think what we have been hearing from a lot of journalists, and community activists we’re engaged with on the ground, is that essentially, the main issue was still the 2020 election, that the big lie was really the narrative that continues to be pushed going forward, and sort of stoking doubt in issues like mail-in ballots, and sort of the process of the elections themselves. And some of those kind of narratives that had initiated in 2020 were just kind of continued and magnified. 

And so, I think that is still a lot of what we’re seeing, and I don’t think that’s going away necessarily. Obviously, you know, the sort of outcome in, particularly, some of the secretary of state elections around the country are reassuring, in terms of some of the election deniers not being in a place to make decisions about the elections. But I think the narrative and the attempts to kind of stoke doubt in the process are definitely still there, and will likely ramp up as we go into 2024. 

I think one of the other things we’ve really noticed more in the past year or two, in particular, is that a lot of the disinformation isn’t happening in this sort of very visible way, that maybe it had been previously, where a lot of things were really happening on Facebook and Twitter, sort of, you know, memes and messages that were pretty visible, and people could see a lot of it. What we’re hearing from a lot of folks we’re working with is really that so much disinformation now is also taking place in encrypted, you know, WhatsApp chats, family chat groups, that information is spreading in ways that are not always entirely visible, either to the media, to researchers, or to really, to anybody. And so, I think it’s also just hard to kind of really assess the scale of what’s happening right now, because a lot of it is happening kind of behind the scenes, in some ways. 

ROBBINS: So, the main topics of misinformation that you’re hearing from journalists about, is it all COVID, all the time? Big lie and COVID, are those the two main topics? 

LOPEZ: Big lie, COVID, we heard quite a bit about some issues around the raid of Mar-a-Lago, and sort of the questions around the role of the FBI, and is the FBI compromised, had become kind of a significant topic, quite a bit around the Dobbs decision, and Roe v. Wade, and sort of just using some kind of really significant political issues, you know, as kind of wedge issues to stoke tension.  

You know, I think disinformation can be about projecting a particular narrative, but it can also just be about, you know, stoking people’s doubt in narratives and in institutions overall. And I think that’s a lot of what we’re seeing right now as well, and particularly, as I said, in sort of the institutions and processes related to elections and public health, unfortunately. 

ROBBINS: So I spent some time with your very useful 2021 survey of reporters and editors on disinformation and newsroom responses, and I’d like to ask you some questions about that. We shared it, and we will—we’ll share the link with everybody again.  

So, first of all, I was struck by how much reporters and editors said disinformation had changed their approach to work. And according to the report, more than 90 percent of the people you guys interviewed had made one or more changes in their journalistic practices, as a result of disinformation.  

You know, I’ve done a lot of writing about disinformation, but I thought back to the days in which—and I don’t want to make myself sound like I’m, you know, on the verge of death here—(laughs)—but it’s been a while since I worked in a newsroom—you know, I—writing about misinformation, yes, but I hadn’t really thought about how it might change my own work.  

So, what did you hear from reporters and editors about how it changed their daily work? 

LOPEZ: Yeah, I mean, I—to some extent, I think it’s sort of a practical question, in that it adds, you know, time. People feel that they have to put more effort into, you know, fact-checking information, thinking about the sources that they’re utilizing, and to looking at, you know, is this photo legitimate? Is this source legitimate? Is somebody trying to get me to report something that’s false, providing false information in an attempt to call—you know, catch me out on something.  

And so, I think there’s a lot—there’s both sort of the practical time that that takes. There’s also sort of the added emotional burden and stress of feeling that that’s a possibility. You know, I think we had 11 percent of respondents who said they had, at some point, accidentally reported false information. So I think that sort of consciousness of the fact that people are trying to manipulate journalists and the media as well, is adding stress and burden. 

You know, then I think there’s also sort of things that people see they need to do more of, to respond to disinformation, so more kind of public engagement, community engagement, you know, spending more time kind of explaining how reporting is being done, which I think is really important. I think, you know, it’s part of—our media literacy work has always included a strong focus on just understanding a little bit more about the practice of journalism, and, you know, so that people don’t get manipulated into thinking, you know, that an anonymous source inherently means something can’t be trusted, or is false, right? Sort of just understanding a little bit of what goes into professional journalism, and that piece of it, I think, is really critical for the public.  

But I think, you know, journalists and newsrooms feel like that’s something they need to spend more time doing, and just doing kind of the basic explanations of both the practice of journalism, and also some of the things they might be reporting on, in addition to, essentially, the kind of traditional reporting that they would be doing. 

ROBBINS: So that’s something salutary about it, is that this sense that somehow—I just assume you trust me, because I am X newspaper, is—and as you know, there’s been a general decline—you know, a huge decline across the board—of trust in institutions. But if we as journalists, and as reporters, and as editors, are saying, we have to explain more to our—to our readers, how we made a decision, without it being overly navel-gazing—and how you find the balance there is an enormous—(laughs)—one, without sounding defensive. I mean, there’s something hugely salutary about that. And the difference between a hard-copy paper and the internet is that you actually have space to do it. And so I think there is something salutary. 

Now, did you find that there were a lot of—you know, that this has become a new standard among news organizations, that just basically explaining how they made news decisions has become increasingly the standard? 

LOPEZ: I don’t know if it’s quite the standard yet, but I think there is a lot more recognition that it has to be a part of the job. And you know, I think we’ve seen that in a lot of different ways. I mean, I was just noticing, I think, last week, during one of the horrifying mass shootings we’ve had just recently, you know, that if you are—if you’re following it on the New York Times’ live feed, every so often, a message will pop up that explains how they report on breaking news stories, and that they may rely heavily on police information at the beginning, but they may make adjustments as they—you know, as that information becomes clearer, and as they speak to additional sources. And I—you know, I thought that was really helpful, honestly.  

And we’ve done some work with the Texas Tribune. You know, they had a whole kind of webpage that just sort of explained how they were going to be reporting on elections. And that included a box about how they report on disinformation, including the fact that sometimes, they won’t report on disinformation, if they sort of assess the reporting might amplify something that might not find that much of an audience otherwise. 

And so, you know, I think those kinds of explainers are becoming more common. And I think they—you know, I do think that’s actually a very good thing, in part just because, you know, I think we don’t have the kind of civic education in this country that we used to, that kind of walks everybody through the process of, you know, how the news gets made in high school, or something, that would—a lot of that has disappeared. And how that happens has also changed as the internet has come into play, and the way that news develops has changed too. 

So, I do think that that’s, you know, a little bit of a silver lining out of all of this, is a bit more transparency about that, and public education, too. 

ROBBINS: So, we were talking before we started about the CDC and the confusion about COVID, and so many of the rules. But we’re generally—Irina was talking about this expectation that somehow, we were all going to get shots and that nobody was going to get sick. And then we got sick anyway. And for those of us who nevertheless believe, or have some faith, in institutions, we didn’t come away and say, oh, well, it’s all a big, you know, lie, and I’m not going to get a shot because of this. And—you know—(laughs)—I can’t even count the number of times I’ve been vaccinated, and I still got COVID. But that doesn’t mean I’m not going to get the next booster that they offer. 

But how much of that do you think was a failure of the CDC, which really had really bad communications, I think, in the Trump administration? And I think Walensky is doing a better job, but not a great job right now. But how much of that was a failure of the way science was reported by newspapers, in the coverage of COVID? And how much of it was inevitable because this was new, and we just didn’t know? 

LOPEZ: Yeah. Well, I think—I think the answer is both of those, because I think, in part, there wasn’t enough explanation of the fact that this was new, and—(laughs)—we couldn’t possibly know all the answers, and that the vaccines are new.  

I mean, when I had COVID earlier this year, I took Paxlovid, you know, and the little insert was very short, because they didn’t know a whole lot about this drug. And they were pretty honest, too—(laughs)—you know, only so many people have taken this, so we only know so much.  

You know, I think that—so I think that was a failure, you know, in terms of how this information was communicated with, you know, being honest about the degree of certainty that existed, and why it might still be really important to get the vaccine, or to take seriously the warnings that were being given, because these were cautionary measures that were worth—you know, worth the potential bits of it that maybe we weren’t completely certain about. 

And I think one of the things that we hear a lot—and that I know there has been some research on—is, you know, people talking about feeling like, you know, even if you just sort of have questions about something, sometimes, the media or sort of society as a whole might dismiss you as a conspiracy theorist, or an extremist, or just being dumb, or whatever it might be. And so people feel like, you know—I think there’s sort of the middle of society, who, you know, isn’t sort of fully enveloped by conspiracy theories, but also, you know, has some questions about things. And feeling like those questions are dismissed can, you know, drive people away from those institutions, and from outlets that they feel might not be, you know, sort of acknowledging that they may have some legitimate doubts, or they may not really understand, you know, how the vaccine was developed. Or they may not really understand the electoral process, and what it means to do absentee ballots. 

And so, I think a lot of it is also about just, sort of, treating, you know, our audiences—whether that’s the CDC, or the media—treating the audience with respect, and kind of acknowledging that, you know, we live in a really confusing time, honestly. (Laughs.) And there’s a lot of anxiety about a lot of different things. And so, you know, I think starting from a point of empathy and respect is really important, in terms of how we try to bring people—you know, to build trust, and to—and to counter some of the disinformation narratives that are out there, because there’s also research that, you know, when people feel more anxious, they will also just cling to information that is more aligned with what they already believe. And so, you know, I think that’s kind of a level of anxiety that we’re just living with at this—in this moment, and have been, especially for the past two and a half years, is contributing to making people more vulnerable, or more susceptible to disinformation. And so, you have to really think about that as a factor, when we think about our messaging, too. 

ROBBINS: So, the other key takeaway from your survey was that only 30 percent of the journalists said their news outlets—outlet had generally effective processes in place to cope with disinformation, and 40 percent said no organization-wide approach exists. OK, my first reaction was, as a masthead editor, oh my God, they just whine. But—(laughs)—the more I thought about it, you—none of that is surprising, but I came away realizing that I wasn’t sure what, quote, effective processes are, and if there is now a consensus about the best way to push back against disinformation.  

And so, you have a list in your report. And they go from, you know, some technical things, to best practices. And so, I wanted to ask you first about the technical things. So, you have this list here of bot detection devices, image verification tools, social media monitoring tools, reverse-image searching tools, using fact-checking sites. You know, is this something that should be assigned to a desk? To individual reporters? Should we be training every one of them in this? I mean, these sound—some of them sound expensive. Some of them sound very technical. It sounds like something that we used to assign to librarians to do, or ask librarians to do. Or you know, I mean, how do—how does—how does a smaller newspaper, you know, master something like that? 

LOPEZ: Yeah, I mean, I think—I think consensus is probably a strong word at this point, for kind of what are necessarily the best practices, and also recognizing that the same things are not going to work for large newspapers as they are for smaller, local outlets, that don’t have the same resources or the same staffing. So, you know, I think that’s part of what we’re trying to kind of work on, coming out of this report, is to consult with more journalists and editors, you know, on the findings, hear more from them about, you know, what—and I’d love to hear from folks in this conversation today too, you know, about what you feel would be useful, because, you know, we would like to, kind of, develop additional resources and programming to support newsrooms in thinking about what they can do, especially those that might not have the resources to do things easily on their own. 

Some of these tools, though, are relatively straightforward and free. And so there are—you know, some of this is pretty low-hanging fruit, right? I mean, reverse-image search is something you can do very easily on Google. You know, bot detection, there are some websites where you can just, you know, put in a Twitter handle, and it’ll tell you the likelihood that it’s a bot or not. There are, you know— 

ROBBINS: We want—we can—you will share this with us. And we will share this with everybody here. 

LOPEZ: Yes. 

ROBBINS: OK, go ahead. I didn’t mean to interrupt you—  

LOPEZ: No. (Laughs.) It’s fine. 

You know, and there are—there are some websites that do things like, you know, track disinformation narratives online, or where you can search for things. There’s one in particular—I will find the name of it, and definitely share it with folks—you know, that was created by researchers, specifically for journalists, to help them be able to track and record disinformation, because, you know, sometimes, people will see something they want to report on, and then the thing gets taken off the internet, as well. And so you kind of lose the original content. And then, you know, they kind of track, you know, the way that narratives might be moving around the internet, as well.  

And so, there are some relatively simple and straightforward resources out there. I think one of the things we were struck by was that even that, you know, wasn’t something that most people felt they knew how to make use of.  

And so, I think there’s a lot of pretty straightforward education and resource-sharing that can be done. But then, you know, some of these things are obviously more complicated. Not every newsroom is going to be able to have somebody on a dedicated disinformation beat. And that might not make sense for everybody. 

So, you know, I think some of it is going to have to be quite tailored to thinking about what makes it an—it an individual newsroom, but at least, you know, making some of these resources available, and thinking about it as a thing that is affecting, you know, basically every journalist, I think, is a really important first step. 

For us, a lot of this is also based on work we’ve been doing over the past four years, about online abuse and its impact on journalists and writers. And that’s—a lot of our work on that was initially just kind of basic training for journalists—you know, here are some ways to keep yourself safe online. Here are ways to think about, you know, how you can respond if you experience online abuse, how to, you know, report it and protect yourself. That really transitioned into working more with newsrooms on sort of what were institutional best practices, and what they could put in place to support and protect journalists, as more of an institutionalized thing. 

But again, that’s very tailored too. And we really—we work with individual newsrooms to think about what is needed, and what will work best for them. But recognizing that these are things that kind of have to be thought about at that level, I think, is really critical. 

ROBBINS: So, you have this list of questions about whether—has your news outlet taken any of these actions. So I assume that means that you think that these are actions that would be better practices, if not best practices.  

Put more emphasis on choosing headlines, ledes, and photos that minimize their potential misuse as disinformation. So, how do I write a better headline and lede, and choose better photos, that minimize their potential misuse as disinformation? I wrote a lot—I’ve written a lot of headlines in my life.  

LOPEZ: (Laughs.) Well, you know, I think part of it is, you know, thinking about—there’s an—there’s an example in the report, actually—I’m not going to remember it off the top of my head, you know—but about, you know, headlines that don’t necessarily state very clearly what is actually happening, or leaves some room for interpretation. Sometimes, I think, you know, headlines that get drafted with a little bit more of a clickbait mindset could potentially be misrepresented. And you have to remember that, you know, most people, obviously, are kind of skimming through their social media, and they might see just the headline. So, I think a lot of it is just remembering that, you know, it isn’t that people are going to see the headline, and then read the whole article. The headline might be all that they ever see of this information. And they—if they only interpret that, you know, how is that going to kind of lodge itself in their mind? 

So, you know, I think some of it is really just a little bit of extra consciousness about that, and about how people are consuming information, and what information you’re kind of putting upfront as what people are going to be most, you know, most kind of struck by. 

You know, and same for photos. Obviously, every photo could potentially be manipulated. (Laughs.) But again, you know, what’s the photo that’s going to be the lede associated with this article? If all people see is the tweet about the article, and the photo that comes up at the top, you know, just being conscious about what that is, and what message it sends about the—about the article itself. 

ROBBINS: So, I think the example that you—in the report was a story in the Chicago Tribune with the headline, a, quote, “healthy doctor died two weeks after getting a COVID-19 vaccine. CDC is investigating why.” 

LOPEZ: Right. 

ROBBINS: And it turned out that far fewer people saw the follow-up news that after an autopsy the doctor’s death was attributed to natural causes and not the vaccine. But you can see how something like that, you know, would be shared wildly. 

LOPEZ: Exactly. Exactly. 

ROBBINS: But this is a—you know, these are—these are sort of—so, other best practices that you think that, you know, that—I mean, I can go through your list of questions here, all of—you know, your list of, you know, implied best—you know, better practices, that completely intrigue me. You know, implement changes to attract and hire journalists to ensure a wide variety of perspectives. That’s—obviously, has many, many advantages. You know, have systems in place to respond quickly to disinformation. That’s a hard one, as you said, particularly as things move away from—you know, I remember when I was at the Times, and—(laughs)—God, I remember when we first started driving cars—(laughs)—that, you know, suddenly, we had people who were monitoring Twitter all the time, and getting stories off of—off of social media. But, you know, the notion that things are moving into encrypted platforms, that becomes a change, and it becomes a real challenge itself.  

What sort of systems should organizations, if they can do it, have in place to respond quickly to disinformation? 

LOPEZ: Yeah, well, I think—I mean, it kind of comes down to the fact that we talk about both pre-bunking and debunking disinformation. So obviously, there’s sort of how you can debunk disinformation that’s already out there, you know, and that, too, requires some assessment. As I said, you know, is this disinformation relatively isolated? Is it not going to, you know, reach that large of an audience? Do you risk amplifying it by saying anything about it? Maybe you don’t need to respond to it at all. But then, thinking consciously about how you do, if you do, debunk. 

I think, you know, what we hope journalists and newsrooms will get better at is the—is the pre-bunking piece, and kind of anticipating where disinformation is likely to occur, you know—and thinking about what sort of information can be provided proactively. This goes a little bit to the—to the question of, you know, more explainers, again, about—not just about sort of the practice of journalism, but also about the issues that are most likely to be contentious.  

So, you know, talking to people—in the 2020 election, we did a lot of work, you know, thinking about how to head off disinformation around an election that was going to be unusual, right? There was going to be a lot of absentee voting, it was going to take longer to count the ballots. And so a lot of, you know, what we were really pushing for was a lot of messaging about the fact that people needed to know that it was not likely that we were going to have results on election night, and that that was going to be OK—(laughs)—you know, there was a reason for that. Here’s how the process works, you know, here’s how elections get called, and really kind of prepping people for that, so that, you know, ideally, they’re then more resilient when disinformation narratives are coming at them, and telling them that this is—means the election has been stolen. 

You know, and so I think there’s a lot of areas where that can be really valuable. And I think the—you know, again, kind of what we’ve heard from a lot of journalists, I think, is that they find a lot of hunger for that, and just sort of issue explainer information. There was a journalist we—I spoke with at a symposium I was at a few months ago, who was talking about, you know, that one of the issues that was really relevant in their community was about rent. And so, they did a whole explainer about sort of who had jurisdiction over rent in local government, and how you could reach out to people about it. They said it was the most popular article on their website for a month, because that was the kind of information people really just felt they needed, and that sort of thing, you know, on any host of issues can help people, you know, be more informed in advance of disinformation kind of coming at them. 

So, you know—and then I think the other piece kind of goes to building community trust as well, and a lot—you know, I think there’s a lot of need for community engagement, especially for local outlets. And again, this is tough, because it requires time and effort. But being out in the community more, again, sort of explaining sort of your role in the community as a local journalist and a local outlet can really, again, kind of help build up that trust, so that when disinformation occurs, you know, your outlet is looked to as a trusted source. And I think that a lot of it is kind of about laying the groundwork for that, so that people—so that there’s a bit more of that resiliency in place in advance. 

ROBBINS: So, I want to turn it over to the group. I’ve got a lot of questions about the resources that exist out there, including the resources that you guys have. But I—we do have one question in the Q&A, we’ll start with that one. But please, I want to—if we can remind everyone how to ask a question, can we—can we do that? 

OPERATOR: Yes, as a reminder— 

ROBBINS: Oh, thanks, Audrey. 

OPERATOR: (Laughs.) No, no, just going to say.  

(Gives queuing instructions.) 

ROBBINS: That’s great. Thank you so much. 

So, we have Mark Lewison asking a question. Mark, do you want to ask the question yourself, or should I read it? 

I will read it. I’m good at reading.  

Many of my college journalism students still treat Dem-GOP sides as equals, and both have legitimate political perspectives to cover in every story they write. They smile at, and forgive the disinformation, and then pretend the GOP is honorable, and worthy of, quote, equal-time coverage. What can we tell these future journalists about, quote, fairness? 

You know, in their defense—I mean, it took me a very, very long time to get past the good people on both sides argument.  

LOPEZ: So, I’m really glad you asked this question, because, actually, we just put out another report just a couple of weeks ago on, basically, this very issue, and really looking at how journalists have been reckoning with the question of, you know, how do you report—do political reporting in particular—but, you know, when one of the sides that you are reporting on, you know, represents a lot more extremist views than it used to, has a lot of, you know, candidates who are, again, election deniers or, you know, expressing white nationalist rhetoric, and things that, you know, are kind of outside the norms of political discourse, at least as they have existed for some time.  

And I think, you know, our sense—we interviewed seventy-five journalists and others for this report. And you know, our sense was really that people are thinking about this a lot—(laughs)—and then that things have changed, you know, quite a bit over the past six years. You know, I think looking at even the reporting, you know, around Trump’s announcement that he was running was quite interesting, you know, in terms of looking at sort of the headlines, and how people brought in, you know, the fact that he had investigations against him that were active, that he had, you know, played a role in stoking an insurrection on January 6th, and sort of, you know, didn’t—I think didn’t kind of give in to some of the temptations that existed earlier to kind of, you know, take advantage of the hubbub that he—(laughs)—created, and that can, you know, can be useful, to some extent, in terms of generating views. 

So, I think, you know, there is a lot of consciousness about this. I think—you know, I don’t think that anybody really believes that, you know, we should be abandoning principles of objectivity and of bringing all sides to a debate to bear. We certainly don’t think that’s a good idea either.  

But I do think, you know, there’s a quote in that report that says something like, you know, that doesn’t mean that—you know, just being fair doesn’t necessarily mean that everybody gets an exactly equal perspective and time, or that you pretend that everybody is exactly the same, right? That it’s OK to acknowledge, you know, that this person was involved in an insurrection, or is an election-denier, or, you know, has some affiliation with an extremist organization. You know, that that is not something that should be sort of left out or that that shouldn’t be a factor in how you report on what they’re saying. 

And this really connects to the disinformation piece, because a lot of the problem—you know, the problem is also that a lot of those same folks are the ones spewing a lot of disinformation, and from positions of power. Which, of course, makes it, you know, more impactful and more challenging to undo its impact. And so I think, you know, again, journalists being kind of prepared to go into interviews equipped with the facts, so that they can counter false statements, making sure that they, you know, include that context in the reporting.  

A lot of what the report looked at was also sort of the—some of the reporting around prominent white nationalists some years back, Richard Spencer and other folks like that, you know, that kind of took an attempt to humanize them and sometimes didn’t fully include all of the context and associations and viewpoints that they represented as starkly as it could have. And so I think, you know, that’s—there was a lot of reflection from the journalists we talked to about how some of that reporting needs to and has evolved.  

But I think it’s—you know, one of the conclusions we kind of came to was that political reporting has become extremism reporting, to some degree. And extremism reporting is a particular thing—(laughs)—that requires some—you know, some particular knowledge and preparation as well. And so I think there’s a lot of lessons to be learned from reporters who have been covering those types of beats for a long time, and thinking about, you know, how we—how that gets a little bit more integrated to reporting more broadly. 

ROBBINS: So Matt Rodewald has a comment here. Matt, as an editorial writer, I’m going to respond to you. I think—I think this is a truncated question. But would you like to—would you like to read your comment, or? Well, Matt wrote: This is the problem right here. You automatically assume the GOP is bad. Why aren’t the Dems bad? Who’s talking to independent voters? What about Republicans who don’t associate with Trump? 

LOPEZ: Well, and I think this gets to my point earlier about the fact that you don’t want to be alienating your audience either, right? And alienating in particular the folks who very much are in the middle of society right now, and feeling, I think, very lost, to some extent, between a pretty polarized debate. And so, you know, I think there’s—you know, there’s a realization that, you know, a very significant number of GOP candidates this past election had expressed denial of the outcome of the 2020 election. You know, that, to me, is concerning, and that’s a little different than sort of a policy debate.  

But that doesn’t necessarily—that doesn’t mean you can dismiss, you know, an entire one of our two major political parties, by any means, or that you don’t want to still talk about those policy issues. So, you know, how do you balance these things? It’s, obviously, very tricky, but I think very much—you know, there is a risk to coming across as, you know, dismissive of one entire side of the political spectrum and of people who are really struggling, you know, to figure out what is their space within such an extremely polarized political dynamic? And so I think that that’s really critical as well. 

And recognizing that people have legitimate questions, you know, that people do want both sides to be challenged and held to account, as they absolutely should be in a democracy, and that, you know, people need to feel like journalism is taking a sort of responsible approach to that. Or else, you know, it could also further stoke doubt and distrust too. 

ROBBINS: And I do—Matt, I think you do have a very legitimate concern here. But I also do think that the responsible coverage is quite clear about where people sit in their perception of the elections—that they’ve lied, their allegiance to Trump or not to Trump. You know, that’s—I think people make pretty clear distinctions on that when they talk about the Republican Party. And—(laughs)—my husband covered the Hill for years for the Washington Post. And over breakfast every morning I’d ask him the same question: You know, all these people who we knew when we covered—you know, do they really believe the things that they say? You know, I don’t think we can make that judgment for them. And Matt and Mark are duking it out in the Q&A right now. (Laughs.) But we can’t make assumptions about what people believe. All we can do is work on what it is that they say. And that’s our responsibility as reporters. 

Which does go to another question here, which is the truth sandwich question, which I wanted to ask you about. But I also wanted to ask you, you know, in a broader context here, when I was on the edit page at the Times we agonized over using the word “lie.” Just absolutely agonized over it. And we just—and this was on an editorial page. Because we said to ourselves, to say someone’s lying means that we know what’s inside their soul, we know what their intention is. So we would use things like “misspoke” or “prevaricated.” And we just came up with all these things. And I think we finally decided that Dick Cheney was lying about Iraq, OK? (Laughs.) And we just—it was just an agony to finally do it.  

And then—and then suddenly—you know, then after the Times, on the news side, came out with, you know, Trump lying about the election, and then Trump lying—something about Hillary Clinton and, you know, undocumented voters being bussed in from New Hampshire—or, to New Hampshire, or something like that. And they used it twice in a very short period of time. And I remember I wrote a piece about this, about how we had agonized, and how it migrated to the news side, and this question about calling a lie a lie was an important thing. On the other hand, did it lose its meaning if you used it too often? 

Now, we use the term “lie” all of the time because people are lying. And I’m not saying—I’m not making a judgment here about which party here, Matt. I’m just saying that people lie, that we should call out when they’re lying. That said, how do you deal with the more general issue of issues that are not true? I mean, do you subscribe to the truth sandwich issue? Do you say, of course, the election wasn’t stolen. So-and-so said it was stolen. Let me remind you, it wasn’t stolen. I mean, what’s the best way of covering that? 

LOPEZ: Yeah. I mean, I think—I mean, first of all, I think I agree that it is important to call an obvious lie a lie. But I do think that there’s a risk to using it too much and that that, you know, that should be somewhat reserved for things that are quite clear. And there are those things, right? So I don’t think it’s, you know, impossible to say that. But I do think it’s important that it be something that people don’t feel is getting bandied about kind of recklessly either. Because, again, I think that will undermine some degree of trust. 

You know, I think the truth sandwich, which, you know, just so people know, is you kind of state the true thing, then you acknowledge the falsehood, then you restate the true thing, so that it’s kind of captured safely within the context of truth. You know, it can be very effective. Obviously, that’s a good kind of shorthand. But I do think there are cases where, you know, I’m not sure that every article right now about election denial needs to kind of acknowledge, you know, well, some people believe X. You know, but it can also just kind of go to how it’s presented, right?  

This isn’t a question of, you know, some kind of giving credence to this as a belief that the election was stolen. I think stating there is zero evidence that the election was stolen, nothing has ever demonstrated that there was any manipulation of this election, is an important thing to keep repeating. And I do think that, you know, one of the reasons that disinformation works is just because our—the way our brains are wired, and it manipulates the way our brains are wired. So, you know, the more that you hear something, the more likely you are to believe it’s true. Even if you start out knowing darn well that it’s not, it just kind of becomes more normalized the more you hear it.  

And that’s something that, you know, purveyors of disinformation very much use. So I think it’s something that we should use in reverse as well. (Laughs.) And so continuing to repeat the facts about a situation even if it feels like maybe they’ve been overstated, I think, is really critical. And so, you know, I think the truth sandwich is not a bad shorthand, but it might not necessarily be exactly the right approach in every case. 

ROBBINS: So you got—I mean, we have a lot of journalists on this, who are not asking questions. Come on, you guys. Ask questions! I mean, I think the question that we want to hear from you is: What help can Summer’s group provide for you? And for resources so that we can make it easier for you? I mean, newsroom assets that journalists told PEN should be developed include a database of experts organized by topics that reporters can turn to for help in debunking disinformation, mechanisms for collaborating across news outlets, for—let’s face it, most news organizations are under-resourced these days. And you’ve got PEN, you’ve got Summer here. And they do have resources. They already do have things that Summer can describe of what’s already out there. But, you know, she’s here. Tell her what you want. 

You want to talk a little bit about your resources while people think about what they want for Christmas from you? 

LOPEZ: (Laughs.) Sure. Well, let me just say, a little bit of the work that we’re doing right now. So there’s work we’re going to be doing to kind of develop resources. And we’re hoping to develop sort of an online hub that will pull together a lot of the existing resources that are out there into one place that’s very accessible, as well as other things that we might develop. You know, we’d like to do sort of some short video explainers and things so that, you know, if you want to make use of, you know, this bot detection tool, or whatever, you have kind of an easy way to figure out how to do that. 

You know, the other part of our work right now is kind of looking at engaging in communities that are particularly targeted by disinformation and working with, you know, trusted figures within those communities who can be sources of resilience and sources of credible information for people. So local journalists, faith leaders, librarians, educators, community leaders, and kind of helping equip them with tools as well and connecting them to each other too.  

And so, you know, I do think that one of the things we did in the runup to the 2020 election was to kind of hold some town halls, virtual town halls, that were, you know, bringing together some of those folks within different communities to talk with the public about, again, what the election was going to look like, what each of their role in that process was going to be, and answer questions. And, you know, those were actually quite well-attended. And I think there is, you know, a lot of, as I said, just kind of interest in understanding what the process is going to look like and what different people within a community, what their role is going to be, and who people can kind of go to if they have questions. 

I mean, disinformation is also—you know, debunking is more effective, and fact-checking is more effective, if it’s coming from people that the audience already trusts and that they identify with. And so if you can, you know, bring people into your efforts to fact check and debunk disinformation, who are part of the communities you’re serving and who are, you know, trusted voices, that can be much more effective. So one of the things we’re trying to do is kind of help foster some of those connections in places where, you know, people are really working on a lot of these issues already, but may not necessarily always have a chance to talk to each other about it.  

And that work right now is focusing in Miami and South Florida, and Austin and Dallas-Fort Worth in Texas, and Phoenix in Arizona, which are all places where PEN America also has chapters and some existing engagement and presence. So would love to hear people’s thoughts on any of that work as well. And then, as I said, we have a lot of—a tremendous amount of resources about online abuse, which we’ll also share, both for individual journalists and for newsrooms to think about—how to be safe online. And increasingly, you know, we’re seeing a lot of that shift into offline intimidation and harassment as well, unfortunately. 

ROBBINS: So that’s great. We have a question, and it’s a question I would love to hear the answer to, from Julie Anderson, who I gather is the editor of the Sun Sentinel in South Florida. Ms. Anderson, do you want to ask your question? 

Q: I’m also the editor of the Orlando Sentinel. And my—especially in Orlando, my audience is very different than South Florida, which is very liberal. But my audience in Orlando is—it’s blue, surrounded by red. And I get letters almost every day from readers who say, you know: How come you’re not covering Hunter Biden? How come Hunter Biden isn’t getting the same treatment as, you know, I don’t know who. But they want it on the front page every day. And so I’m very—(laughs)—wary about this—you know, it’s going to be a storyline coming around again, it seems like. So how do you think the media can handle this better this time around, without covering the wildest conspiracies? 

LOPEZ: Right. No, I think it’s a great question, and really challenging. You know, I do think that it is one of the opportunities to, you know, maybe get ahead of the story, in a way, and talk people through, you know, what are the facts of this story? You know, this is something that is on people’s minds, that people do have questions and concerns about. And so, you know, an explainer kind of laying out what has actually happened in this story from the beginning, you know, anticipating that it’s likely to come around again, could be very effective.  

And I think explaining, again—you know, even explaining some of the disinformation narratives around this. I think this is another thing that, you know, some research has found. I think a lot of this research is pretty nascent, so acknowledging that—(laughs)—as I suggested others should earlier. But, you know, there is research that shows that if you kind of walk people through how disinformation often takes a kernel of truth and then manipulates and twists it into something false—beginning with something that may be, you know, a legitimate piece of fact—and how that is manipulated and why that might be being manipulated, and who might be behind that manipulation—to the extent that you can kind of map that out for people, which is not necessarily straightforward, that can be very effective. 

Because people don’t like to feel like they’ve been duped either, right? And I think, you know, back when a lot of the disinformation was coming internationally and then some of it, you know, being stoked by the Russian government, you know, I think there was—when a lot of that was kind of exposed as being, you know, farms of disinformation creators who were, you know, paid by the Russian government to manipulate Americans and divide our country, you know, I think there was a real sort of sense that, oh, like, that’s—you know, understanding why that might be happening and how it was happening, you know, I think broke a little bit of the effect of that, to some degree. 

It's more challenging, I think, when it’s happening domestically, because people’s own kind of political identities are bound up in it. But I do think that kind of just walking people through the facts of the story—you know, how some of those facts have been manipulated into falsehoods and then, you know, how and why your outlet might choose to report or not report on it. And being pretty upfront about a lot of that could be very effective. 

ROBBINS: That’s an—and I believe that it’s going to be all Hunter Biden all the time once the Judiciary Committee is seated. And there may be a there there, and it may be a completely legitimate news story. 

LOPEZ: Right. 

ROBBINS: And so we will see. That’s very helpful and quite challenging. 

Chris Joyner from the Atlanta Journal-Constitution. Chris, do you want to ask your question? Or it’s a comment. 

Q: Sure. Thanks. This, as I said, is less of a question, more of a comment. But, you know, I find that readers who are vulnerable to disinformation aren’t reading my newspaper. You know, they’re—readers have become so siloed that they go to their own, you know, information sources that reinforce their own opinions. I’m wondering, you know, if—what’s our role as reporters or news organizations if those people who are vulnerable are not actually coming to our site? How do we handle that? 

LOPEZ: Yeah. And definitely a challenge I’ve heard a lot of folks ask about as well. I mean, I think, you know, obviously you can’t sort of force people to come to your website and seek out your information. But I do think that can go to the point of connecting to other parts of the community that may have a different reach. You know, figuring out if there are ways that, you know, you can maybe reach communities, you know, through libraries or through faith communities, and that you might be able to partner with in some way. You know, I think that there are—you know, in our conversations, there are—we’ve had quite a few conversations with librarians.  

We’re PEN America, so we like libraries. And, you know, librarians are still very trusted figures, most of the time, in their communities. They’re becoming increasingly politicized by external narratives as well. But, you know, they are people that folks go to for information. So I think thinking about, you know, how—is there information you can make available to librarians that might enable some of those resources to get out into the community in different ways? But it really is challenging.  

And I think, you know, one of the things that, you know, one of the researchers we spoke to for our most recent report talked about was the fact that if you are kind of immersed in a certain media ecosystem, then even if you go to read other sources, you’re still reading them through the lens of sort of your primary media consumption and your—and the sort of primary narratives that you’re hearing. So it also—you know, it can be very challenging to kind of break through that.  

But I do think, you know, there is, as I said, I think, a significant portion of the population right now that is really just kind of seeking and trying to figure out what information can be trusted and what sources can be trusted. And the more kind of public engagement you can do to connect with some of those folks and build some trust and bring them, you know, potentially more into your orbit, you know, I think is an important thing to at least attempt. 

ROBBINS: Thanks. Alex Hargrave from the Buffalo Bulletin in Wyoming, do you want to ask your question? 

Well, Alex’s question is: My name is Alex and I work for a newspaper in a community of around 5,000 in Wyoming. Disinformation we encounter primarily comes on Facebook, in community groups and comment sections. For example, there was somehow a rumor, untrue, that the Game and Fish Department was moving grizzlies into our area from the greater Yellowstone ecosystem. How much do we respond to false claims, either on social media or in our publication? What is the most effective way to do so? 

LOPEZ: Yeah. So, I mean, I think this is something that is always sort of a question—you know, do you kind of take it on? (Laughs.) And if so, how? And, you know, as I said, I think getting a sense of whether something is, you know, getting a lot of engagement—if something is getting a lot of engagement, and it’s false, then I think taking it on and countering that with factual information is really important. If something is maybe going a little bit under the radar and it’s incorrect but it doesn’t seem that people are paying that much attention to it, then it can be better to just kind of leave it be.  

You know, we have guidance for people on how to—you know, how to deal with, like, a family member who might post false information or share false information to the family chat. And, you know, again, it’s kind of the case that if something—if somebody has kind of just posted something, you know, maybe you want to kind of see—you know, if it’s a friend or a family member, you might go ahead and say something to them at that point, because maybe they’ll decide to take it down. But if it’s something that, you know, is out there and hasn’t gotten that much engagement, then its potential for harm is relatively reduced. 

You know, a lot of disinformation is out there but it doesn’t go viral, and it doesn’t have the same impact. It’s a relatively small amount of disinformation that really, you know, has the most significant impact. And so I think being a little bit selective—and also, you know, our outlet doesn’t have the capacity necessarily to take on everything that’s out there. So, you know, you can—you can pick and choose a bit. And then, you know, I think it depends a little bit on the source. If it’s somebody who you think is sharing something by accident—you know, a lot of what we’re talking about is also misinformation, content that may start out as disinformation, with the intent to deceive, but that people might just be sharing because they don’t realize it’s false—you know, that can be a little bit easier to address. 

But I think, again, sort of responding with facts, responding with credible information from sources that people trust—even if you’re not sure if they might trust them. You know, I think a lot of people do actually have relatively high levels of trust in, you know, local institutions, in local government outlets. And so I think there—you know, there is some ability to bring, you know, whether it’s the local election official or the local housing official, or whoever it is, to bring their voices into the conversation can actually be an effective way to counter some of that disinformation as well. 

ROBBINS: You know, I was—I was struck when you talked about the pre-bunking issue, that—to how one makes—and this is an interesting news judgment. How you make a decision when to jump on something or whether it’s better to let it lie because you don’t want to add more fuel to the fire—using as many cliches as I can possibly do in one sentence. And, you know, we’re good at news judgment as editors, but sort of news judgment is—it’s different when it’s something you’d prefer didn’t capture people’s eyes. It might be really useful for news organizations if there were some sort of central hub of people monitoring across the country when things were taking off, to warn people.  

Almost an early warning system. It’s one thing in a community like Alex’s, when she will know, you know, and have a sense of whether something’s taking off in her community. But the question about Hunter Biden is a good one that is going to have more of a national resonance. And what’s taking off, and how it’s taking off, and what particular aspect of it, that would be a huge service, I would think. And that would be the sort of thing that might have to have people on Telegram or getting, you know, almost investigative in encrypted platforms. And that would be really quite helpful to warn people, when is something taking off? Because the pre-bunking would be a major challenge. 

LOPEZ: Yeah, I think that’s true. And I think, you know, there is a lot that can be very effectively done at the community level. And people do feel they’re, you know, attuned to issues that are arising at that level. And I think a lot of the disinformation, even some of the, you know, kind of larger-scale stuff about the big lie—actually, you know, a lot of the actual examples of things that people are hearing about might be, you know, related to—somebody gave the example of, like, you know, when the absentee ballots were dropped off at a voting location, you know, they were described on somebody’s Facebook post as, like, ballots being dumped behind a building, and made to sound very questionable. And so, you know, there is a lot of—the local pre-bunking is really important.  

But I do think in terms of, you know, tracking kind of what are some of the national issues that are likely to—that are either likely to spark disinformation or just kind of confusion and questions among the public, or where we start to see narratives emerging. I do think there’s probably, you know, more we could do to, you know, identify ways that journalists can kind of track that more easily, and see some of that as it’s starting to emerge. 

ROBBINS: So Andrew, you have a very depressing comment. Do you want to make it quickly? (Laughs.) So Andrew Abel’s comment is— 

Q: Yeah, I’ll let you read it. I’ll let you read it. You probably have a better microphone. Why don’t you go ahead? 

ROBBINS: Well, Andrew Abel, who is an editor and reporter at the Mercersburg, Pennsylvania, Journal, says the problem he sees in his community is that disinformation is keyed to the struggle for power, and that people in his rural community feel powerless. Many are not interested in accurate reporting. The question is merely what messages empower. And he wonders if the approaches we currently use to counter disinformation are based on the faulty assumption that truth matters equally across social groups. 

LOPEZ: Yeah. (Laughs.) You know, I think—I mean, I think the answer’s a little bit in your question, right? I mean, I think that people—the fact that people feel powerless makes them, to some degree, you know, more vulnerable to being targeted with disinformation, because they are looking for things that give them a sense that they have power and that they have, you know, again, the ability to kind of determine—to better understand COVID themselves, than the CDC does, or something like that. And, you know, again, I think it ties into the sense of anxiety that people have just generally about society and the world we live in right now.  

And so, you know, it’s not—again, I repeat the point about kind of coming at this with empathy. This is not about people, you know, just being uneducated or anything in particular. It’s really just that we’re all vulnerable to disinformation. Any of us could be duped. Probably most of us have been at some point. But people are exploiting people’s, you know, sense of anxiety, and people’s sense of powerlessness in a really kind of traumatic moment. And so I think finding—you know, finding ways to make people feel like they, you know, have some ability to make these decisions themselves, to assess truth from falsehood.  

I mean, that—as PEN America, we don’t believe the solution to disinformation is censorship, right? (Laughs.) We’re not all about taking everything down. That’s bad. We’re about empowering people to, you know, have the tools and the knowledge to assess the information that they’re consuming and make informed decisions. And I think that’s a really critical way to frame it as well. And even talking about disinformation can be very fraught for a lot of people at this point. But if you’re talking about, you know, empowering people as information consumers, talking about access to credible information, I think that can be a much more effective narrative that helps people feel they have a sense of power. 

I don’t think we’ve really reckoned with the fact that people consume information now in a completely different way than ever before in human history, and that that’s happened in the last fifteen years. And we haven’t really adjusted our lives to it significantly. So, you know, I think it’s very real, what we’re experiencing right now. And it’s understandable. And so I think we have to, you know, help people feel that they can be part of the solution as well. 

ROBBINS: Well, Summer, I wanted to thank you. And we’re going to turn it back to Irina, but just keeping in mind—and thank everyone for coming in strong in the end with questions. And we’re going to share all sorts of links with you that Summer’s going to share with us. And hope that you will share with us any questions that you have, and suggestions of support that you can use in your newsroom. So, Summer, thank you so much for doing this. And back to Irina. 

FASKIANOS: Thank you very much, Carla and Summer. This was a really good conversation. And thanks to all of you. We couldn’t get to all of your comments, but we did the best that we could. As Carla said, we will send out the link to the webinar and transcript, and resources. You can follow Summer on Twitter at @summerelopez and Carla at @robbinscarla. And as always, we encourage you to visit CFR.org, ForeignAffairs.com, and ThinkGlobalHealth.org for the latest developments and analysis on international trends and how they are affecting the U.S. And please do write us to share suggestions for future webinar topics or speakers. You can email us at [email protected]. So thank you, again, for today’s conversation.  

(END) 
